2025-06-13 14:17:09,641 [ 171629 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse (runner:53, check_args_and_update_paths)
2025-06-13 14:17:09,641 [ 171629 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:79, check_args_and_update_paths)
2025-06-13 14:17:09,641 [ 171629 ] INFO : utils dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/utils (runner:90, check_args_and_update_paths)
2025-06-13 14:17:09,641 [ 171629 ] INFO : base_configs_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:92, check_args_and_update_paths)
clickhouse_integration_tests_volume
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_tu7fxk --privileged --dns-search='.' --memory=30709030912 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_backward_compatibility/test_functions.py::test_string_functions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:ad96270260ff '.
Start tests
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
Test order randomisation NOT enabled.
Enable with --random-order or --random-order-bucket=
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: timeout-2.3.1, repeat-0.9.3, order-1.0.0, reportlog-0.4.0, xdist-3.5.0, random-order-1.1.1
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [4 items]

scheduling tests via LoadFileScheduling

test_database_delta/test.py::test_complex_table_schema
test_backward_compatibility/test_functions.py::test_string_functions
[gw2] [ 25%] SKIPPED test_backward_compatibility/test_functions.py::test_string_functions
[gw0] [ 50%] FAILED test_database_delta/test.py::test_complex_table_schema
test_database_delta/test.py::test_embedded_database_and_tables
[gw0] [ 75%] FAILED test_database_delta/test.py::test_embedded_database_and_tables
test_database_delta/test.py::test_multiple_schemes_tables
[gw0] [100%] FAILED test_database_delta/test.py::test_multiple_schemes_tables

=================================== FAILURES ===================================
__________________________ test_complex_table_schema ___________________________
[gw0] linux -- Python 3.10.12 /usr/bin/python3

started_cluster =

    def test_complex_table_schema(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_spark_query(node1, "CREATE SCHEMA schema_with_complex_tables", ignore_exit_code=True)
        schema = "event_date DATE, event_time TIMESTAMP, hits ARRAY<INT>, ids MAP<INT, STRING>, really_complex STRUCT<f1: INT, f2: STRING>"
        create_query = f"CREATE TABLE schema_with_complex_tables.complex_table ({schema}) using Delta location '/tmp/complex_schema/complex_table'"
        execute_spark_query(node1, create_query, ignore_exit_code=True)
        execute_spark_query(node1, "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\\\"f1\\\", 34, \\\"f2\\\", 'hello')", ignore_exit_code=True)
        node1.query("create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
>       complex_schema_tables = list(sorted(node1.query("SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))

test_database_delta/test.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3571: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E           helpers.client.QueryRuntimeException: Client failed!
E           Return code: 233, stderr: Received exception from server (version 25.3.3):
E Code: 1001. DB::Exception: Received from 172.16.1.2:9000. DB::Exception: std::__1::filesystem::filesystem_error: filesystem error: in directory_iterator::directory_iterator(...): No such file or directory ["/tmp/marksheet_uniform/_delta_log"]. Stack trace:
E
E 0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:106: std::runtime_error::runtime_error(String const&) @ 0x000000003fabed4e
E 1. ./build_docker/./contrib/llvm-project/libcxx/src/system_error.cpp:195: std::system_error::system_error(std::error_code, String const&) @ 0x000000003fad3d7a
E 2. ./contrib/llvm-project/libcxx/include/__filesystem/filesystem_error.h:38: std::filesystem::filesystem_error::filesystem_error[abi:ne190107](String const&, std::filesystem::path const&, std::error_code) @ 0x000000001bdf8b03
E 3. ./contrib/llvm-project/libcxx/include/__filesystem/filesystem_error.h:74: void std::filesystem::__throw_filesystem_error[abi:ne190107](String&, std::filesystem::path const&, std::error_code const&) @ 0x000000003f9eb21d
E 4. ./contrib/llvm-project/libcxx/src/filesystem/error.h:160: std::filesystem::detail::ErrorHandler::report(std::error_code const&) const @ 0x000000003f9e92fa
E 5. ./build_docker/./contrib/llvm-project/libcxx/src/filesystem/directory_iterator.cpp:180: std::filesystem::directory_iterator::directory_iterator(std::filesystem::path const&, std::error_code*, std::filesystem::directory_options) @ 0x000000003f9e6a3a
E 6. ./contrib/llvm-project/libcxx/include/__filesystem/directory_iterator.h:53: DB::LocalObjectStorage::listObjects(String const&, std::vector, std::allocator>>&, unsigned long) const @ 0x0000000026aef73c
E 7. ./build_docker/./src/Storages/ObjectStorage/DataLakes/Common.cpp:18: DB::listFiles(DB::IObjectStorage const&, DB::StorageObjectStorage::Configuration const&, String const&, String const&) @ 0x00000000267bb5ff
E 8. ./build_docker/./src/Storages/ObjectStorage/DataLakes/DeltaLakeMetadata.cpp:184: DB::DeltaLakeMetadataImpl::processMetadataFiles() const @ 0x0000000026774fe0
E 9. ./build_docker/./src/Storages/ObjectStorage/DataLakes/DeltaLakeMetadata.cpp:599: DB::DeltaLakeMetadata::DeltaLakeMetadata(std::shared_ptr, std::weak_ptr, std::shared_ptr) @ 0x000000002676e008
E 10. ./contrib/llvm-project/libcxx/include/__memory/unique_ptr.h:634: DB::DeltaLakeMetadata::create(std::shared_ptr, std::weak_ptr, std::shared_ptr) @ 0x000000002113f060
E 11. ./src/Storages/ObjectStorage/DataLakes/DataLakeConfiguration.h:235: DB::DataLakeConfiguration::updateMetadataObjectIfNeeded(std::shared_ptr, std::shared_ptr) @ 0x00000000282fda61
E 12. ./src/Storages/ObjectStorage/DataLakes/DataLakeConfiguration.h:65: DB::DataLakeConfiguration::update(std::shared_ptr, std::shared_ptr) @ 0x00000000282fd583
E 13. ./build_docker/./src/Storages/ObjectStorage/StorageObjectStorage.cpp:114: DB::StorageObjectStorage::StorageObjectStorage(std::shared_ptr, std::shared_ptr, std::shared_ptr, DB::StorageID const&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional, DB::LoadingStrictnessLevel, bool, std::shared_ptr, bool, bool, std::optional)::$_0::operator()() const @ 0x0000000026577415
E 14. ./build_docker/./src/Storages/ObjectStorage/StorageObjectStorage.cpp:133: DB::StorageObjectStorage::StorageObjectStorage(std::shared_ptr, std::shared_ptr, std::shared_ptr, DB::StorageID const&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional, DB::LoadingStrictnessLevel, bool, std::shared_ptr, bool, bool, std::optional) @ 0x0000000026575783
E 15. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:41: DB::StorageObjectStorage* std::construct_at[abi:ne190107]&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool, std::shared_ptr&, bool, bool, String&, DB::StorageObjectStorage*>(DB::StorageObjectStorage*, std::shared_ptr&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool&&, std::shared_ptr&, bool&&, bool&&, String&) @ 0x00000000265a8a57
E 16. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:49: std::shared_ptr std::allocate_shared[abi:ne190107], std::shared_ptr&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool, std::shared_ptr&, bool, bool, String&, 0>(std::allocator const&, std::shared_ptr&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool&&, std::shared_ptr&, bool&&, bool&&, String&) @ 0x00000000265a8456
E 17. ./contrib/llvm-project/libcxx/include/__memory/shared_ptr.h:851: DB::StorageObjectStorageCluster::StorageObjectStorageCluster(String const&, std::shared_ptr, std::shared_ptr, std::shared_ptr, DB::StorageID const&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional, DB::LoadingStrictnessLevel, std::shared_ptr) @ 0x0000000026597adf
E 18. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:41: DB::StorageObjectStorageCluster* std::construct_at[abi:ne190107] const&, std::shared_ptr, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription, char const (&) [1], DB::FormatSettings, DB::LoadingStrictnessLevel, std::nullptr_t, DB::StorageObjectStorageCluster*>(DB::StorageObjectStorageCluster*, String&, std::shared_ptr const&, std::shared_ptr&&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription&&, char const (&) [1], DB::FormatSettings&&, DB::LoadingStrictnessLevel&&, std::nullptr_t&&) @ 0x00000000282ff781
E 19. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:49: std::shared_ptr std::allocate_shared[abi:ne190107], String&, std::shared_ptr const&, std::shared_ptr, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription, char const (&) [1], DB::FormatSettings, DB::LoadingStrictnessLevel, std::nullptr_t, 0>(std::allocator const&, String&, std::shared_ptr const&, std::shared_ptr&&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription&&, char const (&) [1], DB::FormatSettings&&, DB::LoadingStrictnessLevel&&, std::nullptr_t&&) @ 0x00000000282ff249
E 20. ./contrib/llvm-project/libcxx/include/__memory/shared_ptr.h:851: DB::DatabaseDataLake::tryGetTableImpl(String const&, std::shared_ptr, bool) const @ 0x00000000282ebfbc
E 21. ./build_docker/./src/Databases/DataLake/DatabaseDataLake.cpp:462: DB::DatabaseDataLake::getLightweightTablesIterator(std::shared_ptr, std::function const&, bool) const @ 0x00000000282eeb9c
E 22. ./build_docker/./src/Storages/System/StorageSystemTables.cpp:123: DB::detail::getFilteredTables(DB::ActionsDAG::Node const*, COW::immutable_ptr const&, std::shared_ptr, bool) @ 0x00000000212960dd
E 23. ./build_docker/./src/Storages/System/StorageSystemTables.cpp:844: DB::ReadFromSystemTables::applyFilters(DB::ActionDAGNodes) @ 0x000000002129fff2
E 24. ./src/Processors/QueryPlan/SourceStepWithFilter.h:39: DB::SourceStepWithFilterBase::applyFilters() @ 0x0000000030ae0157
E 25. ./build_docker/./src/Processors/QueryPlan/Optimizations/optimizePrimaryKeyConditionAndLimit.cpp:55: DB::QueryPlanOptimizations::optimizePrimaryKeyConditionAndLimit(std::vector> const&) @ 0x0000000030adfc78
E 26. ./build_docker/./src/Processors/QueryPlan/Optimizations/optimizeTree.cpp:125: DB::QueryPlanOptimizations::optimizeTreeSecondPass(DB::QueryPlanOptimizationSettings const&, DB::QueryPlan::Node&, std::list>&) @ 0x0000000030adc361
E 27. ./build_docker/./src/Processors/QueryPlan/QueryPlan.cpp:487: DB::QueryPlan::buildQueryPipeline(DB::QueryPlanOptimizationSettings const&, DB::BuildQueryPipelineSettings const&) @ 0x000000003091ebbc
E 28. ./build_docker/./src/Interpreters/InterpreterSelectQueryAnalyzer.cpp:275: DB::InterpreterSelectQueryAnalyzer::buildQueryPipeline() @ 0x000000002a33a102
E 29. ./build_docker/./src/Interpreters/InterpreterSelectQueryAnalyzer.cpp:242: DB::InterpreterSelectQueryAnalyzer::execute() @ 0x000000002a3398c7
E 30. ./build_docker/./src/Interpreters/executeQuery.cpp:1457: DB::executeQueryImpl(char const*, char const*, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum, DB::ReadBuffer*, std::shared_ptr&) @ 0x000000002aac50e3
E 31. ./build_docker/./src/Interpreters/executeQuery.cpp:1624: DB::executeQuery(String const&, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum) @ 0x000000002aabd025
E . (STD_EXCEPTION)
E (query: SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%')

helpers/client.py:248: QueryRuntimeException
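The failure chain above: SHOW TABLES on the DataLakeCatalog database iterates every table registered in the embedded Unity Catalog (getLightweightTablesIterator -> tryGetTableImpl) and instantiates DeltaLakeMetadata for each, so LocalObjectStorage::listObjects fails on /tmp/marksheet_uniform/_delta_log, a catalog entry whose data directory was never materialized in this container because the spark-sql setup steps failed (see the Ivy errors further down) while the test passed ignore_exit_code=True. A minimal sketch of a guard the test could run first, assuming the instance-level exec_in_container helper this log already shows (cluster.py:2045); the default location is illustrative:

    # Hedged sketch, not code from the repo: verify a Delta transaction log
    # exists inside the container before querying the catalog, so a silently
    # failed spark-sql step (ignore_exit_code=True) is reported here instead
    # of surfacing as a server-side filesystem_error.
    def assert_delta_log_exists(node, location="/tmp/complex_schema/complex_table"):
        out = node.exec_in_container(
            ["bash", "-c", f"test -d {location}/_delta_log && echo OK || echo MISSING"],
            nothrow=True,
        )
        assert "OK" in out, f"no Delta log at {location}/_delta_log; Spark setup likely failed"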
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config.
Files: config.xml, users.xml
------------------------------ Captured log setup ------------------------------
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:121, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : No running containers (conftest.py:95, cleanup_environment)
2025-06-13 14:17:17 [ 576 ] DEBUG : Pruning Docker networks (conftest.py:97, cleanup_environment)
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[docker network prune --force] (cluster.py:121, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:121, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:145, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_KERBEROS_KDC_TAG 9391ecdee8d7 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV CLICKHOUSE_TESTS_SERVER_BIN_PATH /clickhouse (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV MSAN_OPTIONS abort_on_error=1 poison_in_dtor=1 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV JAVA_TOOL_OPTIONS -Djdk.attach.allowAttachSelf=true (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV TSAN_OPTIONS halt_on_error=1 abort_on_error=1 history_size=7 memory_limit_mb=46080 second_deadlock_stack=1 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV HOSTNAME 71ebb15656d9 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV SHLVL 0 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV HOME /root (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV OLDPWD / (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_HELPER_TAG 5dc43a6382f0 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PYTHONUNBUFFERED 1 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_PYTHON_BOTTLE_TAG d862517635bf (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV UBSAN_OPTIONS print_stacktrace=1 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PYTEST_ADDOPTS --dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_backward_compatibility/test_functions.py::test_string_functions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV COMPOSE_HTTP_TIMEOUT 600 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_MYSQL_PHP_CLIENT_TAG 88be89c1e3b6 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_DOTNET_CLIENT_TAG 11de0b29a15d (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV CLICKHOUSE_TESTS_CLIENT_BIN_PATH /clickhouse (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_MYSQL_JS_CLIENT_TAG 41ba7c2ec2a1 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PATH /spark-3.3.2-bin-hadoop3/bin:/opt/gdb/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_KERBERIZED_HADOOP_TAG latest (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_CHANNEL stable (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_CLIENT_TIMEOUT 300 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_POSTGRESQL_JAVA_CLIENT_TAG a4eff5c7f4d6 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_NGINX_DAV_TAG b55ac9cd7519 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_MYSQL_GOLANG_CLIENT_TAG 9bec2a638e6e (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PWD /ClickHouse/tests/integration (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_MYSQL_JAVA_CLIENT_TAG 766bff31cfe4 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV CLICKHOUSE_TESTS_BASE_CONFIG_DIR /clickhouse-config (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV TZ Etc/UTC (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV JAVA_PATH /usr/lib/jvm/java-11-openjdk-amd64/bin/java (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV DOCKER_BASE_TAG 5ccda723c1fc (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV SPARK_HOME /spark-3.3.2-bin-hadoop3 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV LC_CTYPE C.UTF-8 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV INTEGRATION_TESTS_RUN_ID 1 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV WORKER_FREE_PORTS 30000 30001 30002 30003 30004 30005 30006 30007 30008 30009 30010 30011 30012 30013 30014 30015 30016 30017 30018 30019 30020 30021 30022 30023 30024 30025 30026 30027 30028 30029 30030 30031 30032 30033 30034 30035 30036 30037 30038 30039 30040 30041 30042 30043 30044 30045 30046 30047 30048 30049 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PYTEST_XDIST_TESTRUNUID fdcbb0e1377c4a7db6ed4daad75a2ea8 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PYTEST_XDIST_WORKER gw0 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PYTEST_XDIST_WORKER_COUNT 10 (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : ENV PYTEST_CURRENT_TEST test_database_delta/test.py::test_complex_table_schema (setup) (cluster.py:419, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : CLUSTER INIT base_config_dir:/clickhouse-config (cluster.py:719, __init__)
2025-06-13 14:17:17 [ 576 ] DEBUG : clickhouse_start_command: clickhouse server --config-file=/etc/clickhouse-server/{main_config_file} --log-file=/var/log/clickhouse-server/clickhouse-server.log --errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log (cluster.py:1656, add_instance)
2025-06-13 14:17:17 [ 576 ] DEBUG : Cluster name: project_name:roottestdatabasedelta-gw0. Added instance name:node1 tag:latest base_cmd:['docker', 'compose', '--env-file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env', '--project-name', 'roottestdatabasedelta-gw0', '--file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml'] docker_compose_yml_dir:/ClickHouse/tests/integration/helpers/../../../tests/integration/compose/ (cluster.py:1942, add_instance)
2025-06-13 14:17:17 [ 576 ] INFO : Starting cluster... (test.py:42, started_cluster)
2025-06-13 14:17:17 [ 576 ] INFO : Running tests in /ClickHouse/tests/integration/test_database_delta/test.py (cluster.py:2672, start)
2025-06-13 14:17:17 [ 576 ] DEBUG : Cluster start called. is_up=False (cluster.py:2679, start)
2025-06-13 14:17:17 [ 576 ] DEBUG : Docker networks for project roottestdatabasedelta-gw0 are NETWORK ID NAME DRIVER SCOPE (cluster.py:825, print_all_docker_pieces)
2025-06-13 14:17:17 [ 576 ] DEBUG : Docker containers for project roottestdatabasedelta-gw0 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:833, print_all_docker_pieces)
2025-06-13 14:17:17 [ 576 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw0 are DRIVER VOLUME NAME (cluster.py:841, print_all_docker_pieces)
2025-06-13 14:17:17 [ 576 ] DEBUG : Cleanup called (cluster.py:846, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Docker networks for project roottestdatabasedelta-gw0 are NETWORK ID NAME DRIVER SCOPE (cluster.py:825, print_all_docker_pieces)
2025-06-13 14:17:17 [ 576 ] DEBUG : Docker containers for project roottestdatabasedelta-gw0 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:833, print_all_docker_pieces)
2025-06-13 14:17:17 [ 576 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw0 are DRIVER VOLUME NAME (cluster.py:841, print_all_docker_pieces)
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw0-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Unstopped containers: {} (cluster.py:860, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : No running containers for project: roottestdatabasedelta-gw0 (cluster.py:874, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Trying to prune unused networks... (cluster.py:880, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Trying to prune unused images... (cluster.py:896, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Images pruned (cluster.py:899, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Trying to prune unused volumes... (cluster.py:905, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-06-13 14:17:17 [ 576 ] DEBUG : Volumes pruned: 1 (cluster.py:910, cleanup)
2025-06-13 14:17:17 [ 576 ] DEBUG : Setup directory for instance: node1 (cluster.py:2692, start)
2025-06-13 14:17:17 [ 576 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4536, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Create directory for common tests configuration (cluster.py:4541, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Copy common configuration from helpers (cluster.py:4561, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Generate and write macros file (cluster.py:4613, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Copy custom test config files [] to /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/configs/config.d (cluster.py:4649, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/database (cluster.py:4666, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/logs (cluster.py:4677, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4758, create_dir)
2025-06-13 14:17:17 [ 576 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw'} stored in /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env (cluster.py:96, _create_env_file)
2025-06-13 14:17:17 [ 576 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file)
2025-06-13 14:17:17 [ 576 ] DEBUG : No config file found (config.py:28, find_config_file)
2025-06-13 14:17:17 [ 576 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file)
2025-06-13 14:17:17 [ 576 ] DEBUG : No config file found (config.py:28, find_config_file)
2025-06-13 14:17:17 [ 576 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request)
2025-06-13 14:17:17 [ 576 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml pull] (cluster.py:121, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: node1 Pulling (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: node1 Pulled (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml up -d --no-recreate') (cluster.py:3061, start)
2025-06-13 14:17:28 [ 576 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml up -d --no-recreate] (cluster.py:121, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Creating (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Created (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Creating (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Created (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Starting (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Started (cluster.py:147, run_and_check)
2025-06-13 14:17:28 [ 576 ] DEBUG : ClickHouse instance created (cluster.py:3069, start)
2025-06-13 14:17:28 [ 576 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:1999, get_instance_ip)
2025-06-13 14:17:28 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw0-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:28 [ 576 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2009, get_instance_global_ipv6)
2025-06-13 14:17:28 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw0-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:28 [ 576 ] DEBUG : Waiting for ClickHouse start in node1, ip: 172.16.1.2... (cluster.py:3077, start)
2025-06-13 14:17:28 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw0-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:28 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:28 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:28 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:29 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : http://localhost:None "GET /v1.46/containers/b9b67e937f9c3fa2092aa5e3ddf13dc412aee1dd56ffcc9218f5f5ebf9f87129/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-06-13 14:17:30 [ 576 ] DEBUG : ClickHouse node1 started (cluster.py:3081, start)
2025-06-13 14:17:30 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:False cmd: ['bash', '-c', 'cd /unitycatalog && nohup bin/start-uc-server &'] (cluster.py:2045, exec_in_container)
2025-06-13 14:17:30 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /unitycatalog && nohup bin/start-uc-server &] (cluster.py:121, run_and_check)
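The fixture launches the embedded Unity Catalog server with nohup and returns immediately; nothing in this log waits for it to start accepting requests before the first spark-sql and catalog queries run. A hedged readiness poll (illustrative, not from the repo) against the endpoint the test later uses; the /catalogs route is assumed from the OSS Unity Catalog REST API:

    # Hedged sketch: poll the Unity Catalog REST endpoint from inside the
    # container until it answers, instead of racing the first catalog query.
    import time

    def wait_for_unity_catalog(node, timeout=60):
        deadline = time.time() + timeout
        while time.time() < deadline:
            code = node.exec_in_container(
                ["bash", "-c",
                 "curl -s -o /dev/null -w '%{http_code}' "
                 "http://localhost:8080/api/2.1/unity-catalog/catalogs || true"],
                nothrow=True,
            ).strip()
            if code == "200":
                return
            time.sleep(1)
        raise TimeoutError("Unity Catalog did not become ready on :8080")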
------------------------------ Captured log call -------------------------------
2025-06-13 14:17:32 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n    --master "local[*]" \\\n    --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n    --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n    --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n    --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n    --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n    --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n    --conf "spark.sql.catalog.unity.token=" \\\n    --conf "spark.sql.defaultCatalog=unity" \\\n    -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v \'loading settings\'\n'] (cluster.py:2045, exec_in_container)
2025-06-13 14:17:32 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c
cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \
    --master "local[*]" \
    --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \
    --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
    --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \
    --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \
    --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \
    --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \
    --conf "spark.sql.catalog.unity.token=" \
    --conf "spark.sql.defaultCatalog=unity" \
    -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v 'loading settings'
] (cluster.py:121, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-45afac04-b840-4e3d-936d-347d6c1c9da7;1.0 (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr::: resolution report :: resolve 4369ms :: artifacts dl 0ms (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	:: modules in use: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	--------------------------------------------------------------------- (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	|                  |            modules            ||   artifacts   | (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	|       conf       | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	--------------------------------------------------------------------- (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	|      default     |   3   |   0   |   0   |   0   ||   0   |   0   | (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:	--------------------------------------------------------------------- (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-06-13 14:17:39 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-06-13 14:17:39 [ 576 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-06-13 14:17:39 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location \'/tmp/complex_schema/complex_table\'" | grep -v \'loading settings\'\n'] (cluster.py:2045, exec_in_container) 2025-06-13 14:17:39 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location '/tmp/complex_schema/complex_table'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-06-13 14:17:45 [ 576 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-06-13 14:17:45 [ 576 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-06-13 14:17:45 [ 576 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-06-13 14:17:45 [ 576 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-06-13 14:17:45 [ 576 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-06-13 14:17:45 [ 576 
] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-107ab339-4588-4405-b0df-556404b9ec5a;1.0 (cluster.py:147, run_and_check)
2025-06-13 14:17:45 [ 576 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check)
[... editor's note: the remainder of this stderr dump repeats the CREATE SCHEMA failure above verbatim: twelve "You probably access the destination server through a proxy server that is not well configured." warnings, a resolution report (resolve 4308ms :: artifacts dl 0ms, 0 of 3 modules downloaded), "Host repo1.maven.org not found." / "Host repos.spark-packages.org not found." probes plus local-m2-cache and local-ivy-cache misses for each artifact, ":: UNRESOLVED DEPENDENCIES ::" for org.apache.hadoop#hadoop-aws;3.3.4, io.delta#delta-spark_2.12;3.2.1 and io.unitycatalog#unitycatalog-spark_2.12;0.2.0, and the same java.lang.RuntimeException with the SparkSubmit stack trace ...]
2025-06-13 14:17:45 [ 576 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
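[Editor's note: every spark-sql launch in this job dies the same way. --packages makes spark-submit resolve org.apache.hadoop:hadoop-aws:3.3.4, io.delta:delta-spark_2.12:3.2.1 and io.unitycatalog:unitycatalog-spark_2.12:0.2.0 through Ivy at startup, and this runner has no route to repo1.maven.org or repos.spark-packages.org ("Host ... not found"), so the SQL after -e never executes. Below is a minimal sketch of an offline-safe variant, assuming the three jars were pre-staged in the image; the /jars paths and the helper name are hypothetical, not part of the harness:

    import os
    import subprocess

    SPARK_HOME = "/spark-3.5.4-bin-hadoop3"
    # Hypothetical pre-staged copies of the artifacts Ivy fails to resolve above.
    # Note: unlike --packages, --jars does no transitive resolution, so any
    # transitive dependencies would have to be staged here as well.
    LOCAL_JARS = [
        "/jars/hadoop-aws-3.3.4.jar",
        "/jars/delta-spark_2.12-3.2.1.jar",
        "/jars/unitycatalog-spark_2.12-0.2.0.jar",
    ]

    def spark_sql_offline(query: str) -> subprocess.CompletedProcess:
        """Run spark-sql with --jars (local files) so no Ivy resolution is attempted."""
        missing = [j for j in LOCAL_JARS if not os.path.exists(j)]
        if missing:
            raise RuntimeError(f"jars not pre-staged, cannot run offline: {missing}")
        return subprocess.run(
            [
                os.path.join(SPARK_HOME, "bin", "spark-sql"),
                "--master", "local[*]",
                "--jars", ",".join(LOCAL_JARS),  # local files: no repository access needed
                "--conf", "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension",
                "--conf", "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog",
                "--conf", "spark.sql.catalog.unity.uri=http://localhost:8080",
                "--conf", "spark.sql.defaultCatalog=unity",
                "-S", "-e", query,
            ],
            capture_output=True, text=True, check=True,
        )
]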
2025-06-13 14:17:45 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date(\'2024-10-01\', \'yyyy-MM-dd\'), to_timestamp(\'2024-10-01 00:12:00\'), array(42, 123, 77), map(7, \'v7\', 5, \'v5\'), named_struct(\\"f1\\", 34, \\"f2\\", \'hello\')" | grep -v \'loading settings\'\n'] (cluster.py:2045, exec_in_container)
2025-06-13 14:17:45 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\"f1\", 34, \"f2\", 'hello')" | grep -v 'loading settings' ] (cluster.py:121, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-879ec576-9466-4abf-910a-c2b3e59f10a8;1.0 (cluster.py:147, run_and_check)
2025-06-13 14:17:51 [ 576 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check)
[... editor's note: the identical resolution failure occurs a third time for the INSERT query (resolution report: resolve 4286ms :: artifacts dl 0ms): the same proxy warnings, the same repository probes, ":: UNRESOLVED DEPENDENCIES ::" for the same three artifacts, and the same java.lang.RuntimeException with the SparkSubmit stack trace ...]
2025-06-13 14:17:51 [ 576 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
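[Editor's note: only after all three Spark attempts fail does the test move on to the ClickHouse side below: it creates a DataLakeCatalog database pointed at the Unity Catalog REST endpoint and lists its tables. The LIKE 'schema_with_complex_tables%' filter suggests the catalog database exposes tables under flattened '<schema>.<table>' names; that naming is inferred from the LIKE patterns in these tests, not confirmed by this log. A sketch of the step in the tests' own node1.query style:

    def show_catalog_tables(node1):
        """Recreate the failing step: create a Unity-Catalog-backed database,
        then enumerate its tables (expected as flattened '<schema>.<table>' names)."""
        node1.query(
            "create database complex_schema engine "
            "DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') "
            "settings warehouse = 'unity', catalog_type='unity', vended_credentials=false",
            settings={"allow_experimental_database_unity_catalog": "1"},
        )
        return sorted(
            node1.query(
                "SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%'",
                settings={"use_hive_partitioning": "0"},
            ).strip().split("\n")
        )

    # Usage (inside a test): expects something like
    # 'schema_with_complex_tables.complex_table' in show_catalog_tables(node1)
]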
2025-06-13 14:17:51 [ 576 ] DEBUG : Executing query create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3570, query)
2025-06-13 14:17:51 [ 576 ] DEBUG : Executing query SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%' on node1 (cluster.py:3570, query)
______________________ test_embedded_database_and_tables _______________________
[gw0] linux -- Python 3.10.12 /usr/bin/python3
started_cluster = 
    def test_embedded_database_and_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        node1.query("create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
>       default_tables = list(sorted(node1.query("SHOW TABLES FROM unity_test LIKE 'default%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
test_database_delta/test.py:80: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3571: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
[... editor's note: the get_answer() helper source printed here is identical to the copy shown in the test_complex_table_schema failure above ...]
E   helpers.client.QueryRuntimeException: Client failed! Return code: 233, stderr: Received exception from server (version 25.3.3):
E   Code: 1001. DB::Exception: Received from 172.16.1.2:9000. DB::Exception: std::__1::filesystem::filesystem_error: filesystem error: in directory_iterator::directory_iterator(...): No such file or directory ["/tmp/marksheet_uniform/_delta_log"]. Stack trace:
E   
E   0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:106: std::runtime_error::runtime_error(String const&) @ 0x000000003fabed4e
E   1.
./build_docker/./contrib/llvm-project/libcxx/src/system_error.cpp:195: std::system_error::system_error(std::error_code, String const&) @ 0x000000003fad3d7a
E 2. ./contrib/llvm-project/libcxx/include/__filesystem/filesystem_error.h:38: std::filesystem::filesystem_error::filesystem_error[abi:ne190107](String const&, std::filesystem::path const&, std::error_code) @ 0x000000001bdf8b03
E 3. ./contrib/llvm-project/libcxx/include/__filesystem/filesystem_error.h:74: void std::filesystem::__throw_filesystem_error[abi:ne190107](String&, std::filesystem::path const&, std::error_code const&) @ 0x000000003f9eb21d
E 4. ./contrib/llvm-project/libcxx/src/filesystem/error.h:160: std::filesystem::detail::ErrorHandler::report(std::error_code const&) const @ 0x000000003f9e92fa
E 5. ./build_docker/./contrib/llvm-project/libcxx/src/filesystem/directory_iterator.cpp:180: std::filesystem::directory_iterator::directory_iterator(std::filesystem::path const&, std::error_code*, std::filesystem::directory_options) @ 0x000000003f9e6a3a
E 6. ./contrib/llvm-project/libcxx/include/__filesystem/directory_iterator.h:53: DB::LocalObjectStorage::listObjects(String const&, std::vector, std::allocator>>&, unsigned long) const @ 0x0000000026aef73c
E 7. ./build_docker/./src/Storages/ObjectStorage/DataLakes/Common.cpp:18: DB::listFiles(DB::IObjectStorage const&, DB::StorageObjectStorage::Configuration const&, String const&, String const&) @ 0x00000000267bb5ff
E 8. ./build_docker/./src/Storages/ObjectStorage/DataLakes/DeltaLakeMetadata.cpp:184: DB::DeltaLakeMetadataImpl::processMetadataFiles() const @ 0x0000000026774fe0
E 9. ./build_docker/./src/Storages/ObjectStorage/DataLakes/DeltaLakeMetadata.cpp:599: DB::DeltaLakeMetadata::DeltaLakeMetadata(std::shared_ptr, std::weak_ptr, std::shared_ptr) @ 0x000000002676e008
E 10. ./contrib/llvm-project/libcxx/include/__memory/unique_ptr.h:634: DB::DeltaLakeMetadata::create(std::shared_ptr, std::weak_ptr, std::shared_ptr) @ 0x000000002113f060
E 11. ./src/Storages/ObjectStorage/DataLakes/DataLakeConfiguration.h:235: DB::DataLakeConfiguration::updateMetadataObjectIfNeeded(std::shared_ptr, std::shared_ptr) @ 0x00000000282fda61
E 12. ./src/Storages/ObjectStorage/DataLakes/DataLakeConfiguration.h:65: DB::DataLakeConfiguration::update(std::shared_ptr, std::shared_ptr) @ 0x00000000282fd583
E 13. ./build_docker/./src/Storages/ObjectStorage/StorageObjectStorage.cpp:114: DB::StorageObjectStorage::StorageObjectStorage(std::shared_ptr, std::shared_ptr, std::shared_ptr, DB::StorageID const&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional, DB::LoadingStrictnessLevel, bool, std::shared_ptr, bool, bool, std::optional)::$_0::operator()() const @ 0x0000000026577415
E 14. ./build_docker/./src/Storages/ObjectStorage/StorageObjectStorage.cpp:133: DB::StorageObjectStorage::StorageObjectStorage(std::shared_ptr, std::shared_ptr, std::shared_ptr, DB::StorageID const&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional, DB::LoadingStrictnessLevel, bool, std::shared_ptr, bool, bool, std::optional) @ 0x0000000026575783
E 15. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:41: DB::StorageObjectStorage* std::construct_at[abi:ne190107]&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool, std::shared_ptr&, bool, bool, String&, DB::StorageObjectStorage*>(DB::StorageObjectStorage*, std::shared_ptr&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool&&, std::shared_ptr&, bool&&, bool&&, String&) @ 0x00000000265a8a57
E 16. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:49: std::shared_ptr std::allocate_shared[abi:ne190107], std::shared_ptr&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool, std::shared_ptr&, bool, bool, String&, 0>(std::allocator const&, std::shared_ptr&, std::shared_ptr const&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional&, DB::LoadingStrictnessLevel&, bool&&, std::shared_ptr&, bool&&, bool&&, String&) @ 0x00000000265a8456
E 17. ./contrib/llvm-project/libcxx/include/__memory/shared_ptr.h:851: DB::StorageObjectStorageCluster::StorageObjectStorageCluster(String const&, std::shared_ptr, std::shared_ptr, std::shared_ptr, DB::StorageID const&, DB::ColumnsDescription const&, DB::ConstraintsDescription const&, String const&, std::optional, DB::LoadingStrictnessLevel, std::shared_ptr) @ 0x0000000026597adf
E 18. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:41: DB::StorageObjectStorageCluster* std::construct_at[abi:ne190107] const&, std::shared_ptr, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription, char const (&) [1], DB::FormatSettings, DB::LoadingStrictnessLevel, std::nullptr_t, DB::StorageObjectStorageCluster*>(DB::StorageObjectStorageCluster*, String&, std::shared_ptr const&, std::shared_ptr&&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription&&, char const (&) [1], DB::FormatSettings&&, DB::LoadingStrictnessLevel&&, std::nullptr_t&&) @ 0x00000000282ff781
E 19. ./contrib/llvm-project/libcxx/include/__memory/construct_at.h:49: std::shared_ptr std::allocate_shared[abi:ne190107], String&, std::shared_ptr const&, std::shared_ptr, std::shared_ptr&, DB::StorageID, DB::ColumnsDescription const&, DB::ConstraintsDescription, char const (&) [1], DB::FormatSettings, DB::LoadingStrictnessLevel, std::nullptr_t, 0>(std::allocator const&, String&, std::shared_ptr const&, std::shared_ptr&&, std::shared_ptr&, DB::StorageID&&, DB::ColumnsDescription const&, DB::ConstraintsDescription&&, char const (&) [1], DB::FormatSettings&&, DB::LoadingStrictnessLevel&&, std::nullptr_t&&) @ 0x00000000282ff249
E 20. ./contrib/llvm-project/libcxx/include/__memory/shared_ptr.h:851: DB::DatabaseDataLake::tryGetTableImpl(String const&, std::shared_ptr, bool) const @ 0x00000000282ebfbc
E 21. ./build_docker/./src/Databases/DataLake/DatabaseDataLake.cpp:462: DB::DatabaseDataLake::getLightweightTablesIterator(std::shared_ptr, std::function const&, bool) const @ 0x00000000282eeb9c
E 22. ./build_docker/./src/Storages/System/StorageSystemTables.cpp:123: DB::detail::getFilteredTables(DB::ActionsDAG::Node const*, COW::immutable_ptr const&, std::shared_ptr, bool) @ 0x00000000212960dd
E 23. ./build_docker/./src/Storages/System/StorageSystemTables.cpp:844: DB::ReadFromSystemTables::applyFilters(DB::ActionDAGNodes) @ 0x000000002129fff2
E 24. ./src/Processors/QueryPlan/SourceStepWithFilter.h:39: DB::SourceStepWithFilterBase::applyFilters() @ 0x0000000030ae0157
E 25. ./build_docker/./src/Processors/QueryPlan/Optimizations/optimizePrimaryKeyConditionAndLimit.cpp:55: DB::QueryPlanOptimizations::optimizePrimaryKeyConditionAndLimit(std::vector> const&) @ 0x0000000030adfc78
E 26. ./build_docker/./src/Processors/QueryPlan/Optimizations/optimizeTree.cpp:125: DB::QueryPlanOptimizations::optimizeTreeSecondPass(DB::QueryPlanOptimizationSettings const&, DB::QueryPlan::Node&, std::list>&) @ 0x0000000030adc361
E 27. ./build_docker/./src/Processors/QueryPlan/QueryPlan.cpp:487: DB::QueryPlan::buildQueryPipeline(DB::QueryPlanOptimizationSettings const&, DB::BuildQueryPipelineSettings const&) @ 0x000000003091ebbc
E 28. ./build_docker/./src/Interpreters/InterpreterSelectQueryAnalyzer.cpp:275: DB::InterpreterSelectQueryAnalyzer::buildQueryPipeline() @ 0x000000002a33a102
E 29. ./build_docker/./src/Interpreters/InterpreterSelectQueryAnalyzer.cpp:242: DB::InterpreterSelectQueryAnalyzer::execute() @ 0x000000002a3398c7
E 30. ./build_docker/./src/Interpreters/executeQuery.cpp:1457: DB::executeQueryImpl(char const*, char const*, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum, DB::ReadBuffer*, std::shared_ptr&) @ 0x000000002aac50e3
E 31. ./build_docker/./src/Interpreters/executeQuery.cpp:1624: DB::executeQuery(String const&, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum) @ 0x000000002aabd025
E . (STD_EXCEPTION)
E (query: SHOW TABLES FROM unity_test LIKE 'default%')
helpers/client.py:248: QueryRuntimeException
------------------------------ Captured log call -------------------------------
2025-06-13 14:17:53 [ 576 ] DEBUG : Executing query create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3570, query)
2025-06-13 14:17:53 [ 576 ] DEBUG : Executing query SHOW TABLES FROM unity_test LIKE 'default%' on node1 (cluster.py:3570, query)
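These failures all share the same shape: SHOW TABLES on a DataLakeCatalog database walks every catalog entry (DatabaseDataLake::getLightweightTablesIterator -> tryGetTableImpl), and constructing storage for each table forces DeltaLakeMetadata to list <location>/_delta_log with a filesystem directory_iterator, which throws as soon as one location is missing on local disk. A minimal pre-flight check, sketched in the style of these integration tests (assuming the helpers' ClickHouseInstance.exec_in_container API; the path is the one from the exception below):

    def delta_log_present(node, table_location):
        # A table is only readable by DeltaLakeMetadata if <location>/_delta_log
        # exists inside the server container; `test -d` mirrors exactly the
        # condition that makes directory_iterator throw otherwise.
        out = node.exec_in_container(
            ["bash", "-c", f"test -d {table_location}/_delta_log && echo OK || echo MISSING"]
        )
        return out.strip() == "OK"

    # Hypothetical usage before the failing query:
    # assert delta_log_present(node1, "/tmp/marksheet_uniform"), (
    #     "Spark never materialized the table; SHOW TABLES would hit STD_EXCEPTION"
    # )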
_________________________ test_multiple_schemes_tables _________________________
[gw0] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = 

    def test_multiple_schemes_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_multiple_spark_queries(node1, [f'CREATE SCHEMA test_schema{i}' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'CREATE TABLE test_schema{i}.test_table{i} (col1 int, col2 double) using Delta location \'/tmp/test_schema{i}/test_table{i}\'' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'INSERT INTO test_schema{i}.test_table{i} VALUES ({i}, {i}.0)' for i in range(10)], True)
        node1.query("create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
>       multi_schema_tables = list(sorted(node1.query("SHOW TABLES FROM multi_schema_test LIKE 'test_schema%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))

test_database_delta/test.py:103:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3571: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E           helpers.client.QueryRuntimeException: Client failed! Return code: 233, stderr: Received exception from server (version 25.3.3):
E           Code: 1001. DB::Exception: Received from 172.16.1.2:9000. DB::Exception: std::__1::filesystem::filesystem_error: filesystem error: in directory_iterator::directory_iterator(...): No such file or directory ["/tmp/marksheet_uniform/_delta_log"]. Stack trace:
E
E           0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:106: std::runtime_error::runtime_error(String const&) @ 0x000000003fabed4e
E           1. ./build_docker/./contrib/llvm-project/libcxx/src/system_error.cpp:195: std::system_error::system_error(std::error_code, String const&) @ 0x000000003fad3d7a
E           2. - 31. [... identical to the stack trace in the previous failure ...]
E . (STD_EXCEPTION)
E (query: SHOW TABLES FROM multi_schema_test LIKE 'test_schema%')
helpers/client.py:248: QueryRuntimeException
------------------------------ Captured log call -------------------------------
2025-06-13 14:17:54 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: (cluster.py:2045, exec_in_container)
    cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \
        --master "local[*]" \
        --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \
        --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
        --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \
        --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \
        --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \
        --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \
        --conf "spark.sql.catalog.unity.token=" \
        --conf "spark.sql.defaultCatalog=unity" \
        -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v 'loading settings'
2025-06-13 14:17:54 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c <the same spark-sql invocation>] (cluster.py:121, run_and_check)
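The captured command shows the pattern the suite uses for Spark-side setup: all statements are batched into a single spark-sql process inside the node's container. A rough reconstruction of that mechanism (the real execute_multiple_spark_queries lives in test_database_delta/test.py; the names and the plain-subprocess route here are illustrative only):

    import subprocess

    SPARK_HOME = "/spark-3.5.4-bin-hadoop3"            # from the captured command
    CONTAINER = "roottestdatabasedelta-gw0-node1-1"    # the gw0 worker's node1

    def run_spark_sql_batch(queries, container=CONTAINER):
        # Join all statements into one spark-sql call so JVM startup, package
        # resolution, and catalog setup are paid once per batch, not per query.
        script = (
            f'cd {SPARK_HOME} && bin/spark-sql --name "s3-uc-test" '
            '--master "local[*]" '
            '--packages "org.apache.hadoop:hadoop-aws:3.3.4,'
            'io.delta:delta-spark_2.12:3.2.1,'
            'io.unitycatalog:unitycatalog-spark_2.12:0.2.0" '
            '--conf "spark.sql.defaultCatalog=unity" '
            f'-S -e "{";".join(queries)}" | grep -v \'loading settings\''
        )
        # check=False mirrors nothrow:True in the log: a failure does not raise.
        return subprocess.run(["docker", "exec", container, "bash", "-c", script],
                              capture_output=True, text=True, check=False)

    # e.g. run_spark_sql_batch([f"CREATE SCHEMA test_schema{i}" for i in range(10)])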
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-6a49dc11-d7be-4d3e-8307-ee05c4092765;1.0 (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	confs: [default] (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) [the same warning is logged 12 times]
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr::: resolution report :: resolve 4249ms :: artifacts dl 0ms (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	:: modules in use: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	--------------------------------------------------------------------- (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	|                  |            modules            ||   artifacts   | (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	|       conf       | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	--------------------------------------------------------------------- (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	|      default     |   3   |   0   |   0   |   0   ||   0   |   0   | (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	--------------------------------------------------------------------- (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	==== local-m2-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	==== local-ivy-cache: tried (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	==== central: tried (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	==== spark-packages: tried (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	  https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check)
[... the same "Host not found" warnings and resolver attempts (local-m2-cache, local-ivy-cache, central, spark-packages) follow for io.delta#delta-spark_2.12;3.2.1 and io.unitycatalog#unitycatalog-spark_2.12;0.2.0 ...]
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		:::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		::          UNRESOLVED DEPENDENCIES         :: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		:::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		:: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		:: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		:: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:		:::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Stderr:	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-06-13 14:18:00 [ 576 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
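Everything from the UNRESOLVED DEPENDENCIES block down to Exitcode:1 has a single root cause: the runner has no route to repo1.maven.org or repos.spark-packages.org, so --packages has nothing to resolve against beyond the empty local m2/ivy caches. One hermetic alternative, assuming the three jars were pre-baked into the image under a directory such as /opt/spark-jars (a hypothetical path, not something present in this log), is to hand them to spark-sql via --jars and skip Ivy resolution entirely:

    import subprocess

    def spark_sql_offline(sql, container="roottestdatabasedelta-gw0-node1-1"):
        # Collect the pre-baked jars inside the container and pass them via
        # --jars; with no --packages left, Ivy never touches the network.
        script = (
            "cd /spark-3.5.4-bin-hadoop3 && "
            "JARS=$(ls /opt/spark-jars/*.jar | paste -sd, -) && "  # hypothetical dir
            f'bin/spark-sql --jars "$JARS" --conf "spark.sql.defaultCatalog=unity" -S -e "{sql}"'
        )
        return subprocess.run(["docker", "exec", container, "bash", "-c", script],
                              capture_output=True, text=True)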
2025-06-13 14:18:00 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: (cluster.py:2045, exec_in_container)
    cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" [same --master/--packages/--conf options as the previous invocation] \
        -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location '/tmp/test_schema0/test_table0';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location '/tmp/test_schema1/test_table1'; [... likewise for test_schema2 through test_schema9 ...]" | grep -v 'loading settings'
2025-06-13 14:18:00 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c <the same spark-sql invocation>] (cluster.py:121, run_and_check)
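Note that each of these spark-sql runs executes with nothrow:True, and the tests pass ignore-error flags to the helpers, so the Exitcode:1 from the failed resolution is swallowed; the breakage only resurfaces later as the server-side STD_EXCEPTION above. A fail-fast wrapper along these lines (exec_fn is a stand-in for whatever helper actually runs the command and returns exit code plus output; not an existing helper in this suite) would have pointed at the real culprit immediately:

    class SparkSetupError(RuntimeError):
        """A spark-sql preparation step exited non-zero."""

    def run_or_fail(exec_fn, script):
        # exec_fn: callable returning (exit_code, combined_output) -- an
        # assumed shape wrapping the cluster helper that now uses nothrow=True.
        code, output = exec_fn(script)
        if code != 0:
            # Surfacing "unresolved dependency: ..." here beats debugging the
            # later "No such file or directory [.../_delta_log]" from ClickHouse.
            raise SparkSetupError(f"spark-sql exited with {code}:\n{output}")
        return output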
2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-e7ed48ab-3918-4a16-ba17-37103b3f826d;1.0 (cluster.py:147, run_and_check)
[... stderr is otherwise identical to the previous spark-sql attempt: Ivy cache banner, twelve proxy warnings, resolution report (resolve 4302ms, 3 modules, 0 artifacts), and the same resolver attempts for all three modules ...]
2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr:		:: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check)
2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr:		:: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check)
2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr:		:: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check)
2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check)
LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-06-13 14:18:05 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v \'loading settings\'\n'] (cluster.py:2045, exec_in_container) 2025-06-13 14:18:05 [ 576 ] DEBUG : Command:[docker exec 
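Every lookup above dies the same way: both remote repositories fail with "Host ... not found" (a DNS failure inside the runner container), and none of the three coordinates is present in the local .m2 or .ivy2 caches, so spark-submit cannot resolve --packages at all. A minimal sketch of one way to keep such a run offline, assuming the three jars are baked into the image beforehand; the /opt/spark-extra-jars path and the helper name are assumptions for illustration, not taken from this log:

import os
import subprocess

# Assumed location of pre-fetched jars; not taken from this log.
JARS_DIR = "/opt/spark-extra-jars"
JARS = [
    "hadoop-aws-3.3.4.jar",
    "delta-spark_2.12-3.2.1.jar",
    "unitycatalog-spark_2.12-0.2.0.jar",
]

def offline_spark_sql(sql):
    """Run spark-sql against local jars only, so Ivy never touches the network."""
    jars = ",".join(os.path.join(JARS_DIR, name) for name in JARS)
    cmd = [
        "/spark-3.5.4-bin-hadoop3/bin/spark-sql",
        "--master", "local[*]",
        "--jars", jars,  # local files replace --packages coordinate resolution
        "-S", "-e", sql,
    ]
    return subprocess.run(cmd, capture_output=True, text=True, check=False)

Passing local files via --jars skips Ivy coordinate resolution entirely, so no repository, proxy, or DNS lookup is involved.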
2025-06-13 14:18:05 [ 576 ] DEBUG : run container_id:roottestdatabasedelta-gw0-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v \'loading settings\'\n'] (cluster.py:2045, exec_in_container)
2025-06-13 14:18:05 [ 576 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw0-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \
    --master "local[*]" \
    --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \
    --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
    --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \
    --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \
    --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \
    --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \
    --conf "spark.sql.catalog.unity.token=" \
    --conf "spark.sql.defaultCatalog=unity" \
    -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v 'loading settings' ] (cluster.py:121, run_and_check)
2025-06-13 14:18:11 [ 576 ] DEBUG : Stderr (cluster.py:147, run_and_check):
    Ivy Default Cache set to: /root/.ivy2/cache
    The jars for the packages stored in: /root/.ivy2/jars
    org.apache.hadoop#hadoop-aws added as a dependency
    io.delta#delta-spark_2.12 added as a dependency
    io.unitycatalog#unitycatalog-spark_2.12 added as a dependency
    :: resolving dependencies :: org.apache.spark#spark-submit-parent-a7f80c5f-9f8f-4c69-b0ae-b6c4a127a0da;1.0
        confs: [default]
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    You probably access the destination server through a proxy server that is not well configured.
    :: resolution report :: resolve 4240ms :: artifacts dl 0ms
        :: modules in use:
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   3   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
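The resolution report is the compact version of the failure: 3 modules requested, 0 found by search, 0 downloaded, after 4240ms spent probing repositories whose hostnames do not even resolve. A hedged sketch of a fail-fast guard a suite could run before the first spark-submit; the function name and the choice of port 443 are illustrative assumptions:

import socket

# The two repositories named throughout this log.
REPO_HOSTS = ["repo1.maven.org", "repos.spark-packages.org"]

def repos_reachable(timeout=2.0):
    """True only if every artifact repository resolves in DNS and accepts TCP 443."""
    for host in REPO_HOSTS:
        try:
            with socket.create_connection((host, 443), timeout=timeout):
                pass
        except OSError:  # covers the "Host ... not found" DNS failure seen above
            return False
    return True

A conftest fixture could call pytest.skip() when this returns False, turning a multi-second per-test Ivy timeout into an immediate, correctly labeled skip.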
2025-06-13 14:18:11 [ 576 ] DEBUG : Stderr (cluster.py:147, run_and_check):
    :: problems summary ::
    :::: WARNINGS
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar
    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom
    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar
    module not found: org.apache.hadoop#hadoop-aws;3.3.4
    ==== local-m2-cache: tried
      file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom
      -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar:
      file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar
    ==== local-ivy-cache: tried
      /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml
      -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar:
      /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar
    ==== central: tried
      https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom
      -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar:
      https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar
    ==== spark-packages: tried
      https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom
      -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar:
      https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar
    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom
    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar
    module not found: io.delta#delta-spark_2.12;3.2.1
    ==== local-m2-cache: tried
      file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom
      -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar:
      file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar
    ==== local-ivy-cache: tried
      /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml
      -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar:
      /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar
    ==== central: tried
      https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom
      -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar:
      https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar
    ==== spark-packages: tried
      https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom
      -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar:
      https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar
    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom
    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar
    module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0
    ==== local-m2-cache: tried
      file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom
      -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar:
      file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar
    ==== local-ivy-cache: tried
      /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml
      -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar:
      /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar
    ==== central: tried
      https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom
      -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar:
      https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar
    ==== spark-packages: tried
      https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom
      -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar:
      https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar
    ::::::::::::::::::::::::::::::::::::::::::::::
    ::          UNRESOLVED DEPENDENCIES         ::
    ::::::::::::::::::::::::::::::::::::::::::::::
    :: org.apache.hadoop#hadoop-aws;3.3.4: not found
    :: io.delta#delta-spark_2.12;3.2.1: not found
    :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found
    ::::::::::::::::::::::::::::::::::::::::::::::
    :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
    Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613)
        at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2025-06-13 14:18:11 [ 576 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
2025-06-13 14:18:11 [ 576 ] DEBUG : Executing query create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3570, query)
2025-06-13 14:18:12 [ 576 ] DEBUG : Executing query SHOW TABLES FROM multi_schema_test LIKE 'test_schema%' on node1 (cluster.py:3570, query)
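Note the sequencing above: spark-sql exits with Exitcode:1, yet the test proceeds to create the DataLakeCatalog database and run SHOW TABLES, which is what finally fails inside ClickHouse with the QueryRuntimeException reported at the top. That is a consequence of calling execute_spark_query with ignore_exit_code=True. A hypothetical hardened helper; the name and exact quoting are assumptions, while exec_in_container and its nothrow flag appear in the log records above:

def execute_spark_query_checked(node, sql):
    """Run one Spark SQL statement in the container and fail fast on error.

    Sketch of a replacement for execute_spark_query(..., ignore_exit_code=True):
    with nothrow=False the helper raises on a non-zero exit code, so an Ivy
    resolution failure is reported at its source instead of surfacing later
    as a ClickHouse error. Assumes `sql` contains no double quotes."""
    return node.exec_in_container(
        [
            "bash",
            "-c",
            'cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --master "local[*]" -S -e "%s"' % sql,
        ],
        nothrow=False,  # raise instead of silently returning on Exitcode:1
    )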
---------------------------- Captured log teardown -----------------------------
2025-06-13 14:18:12 [ 576 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml stop --timeout 20] (cluster.py:121, run_and_check)
2025-06-13 14:18:18 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopping (cluster.py:147, run_and_check)
2025-06-13 14:18:18 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopped (cluster.py:147, run_and_check)
2025-06-13 14:18:18 [ 576 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check)
2025-06-13 14:18:18 [ 576 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/.env --project-name roottestdatabasedelta-gw0 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw0/node1/docker-compose.yml down --volumes] (cluster.py:121, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopping (cluster.py:147, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Stopped (cluster.py:147, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Removing (cluster.py:147, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stderr: Container roottestdatabasedelta-gw0-node1-1 Removed (cluster.py:147, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Removing (cluster.py:147, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stderr: Network roottestdatabasedelta-gw0_default Removed (cluster.py:147, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Cleanup called (cluster.py:846, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : Docker networks for project roottestdatabasedelta-gw0 are NETWORK ID NAME DRIVER SCOPE (cluster.py:825, print_all_docker_pieces)
2025-06-13 14:18:19 [ 576 ] DEBUG : Docker containers for project roottestdatabasedelta-gw0 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:833, print_all_docker_pieces)
2025-06-13 14:18:19 [ 576 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw0 are DRIVER VOLUME NAME (cluster.py:841, print_all_docker_pieces)
2025-06-13 14:18:19 [ 576 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw0-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Unstopped containers: {} (cluster.py:860, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : No running containers for project: roottestdatabasedelta-gw0 (cluster.py:874, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : Trying to prune unused networks... (cluster.py:880, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : Trying to prune unused images... (cluster.py:896, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Images pruned (cluster.py:899, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : Trying to prune unused volumes... (cluster.py:905, cleanup)
2025-06-13 14:18:19 [ 576 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-06-13 14:18:19 [ 576 ] DEBUG : Volumes pruned: 1 (cluster.py:910, cleanup)
----------------- generated report log file: parallel0_1.jsonl -----------------
============================== slowest durations ===============================
20.43s call test_database_delta/test.py::test_complex_table_schema
18.49s call test_database_delta/test.py::test_multiple_schemes_tables
15.47s setup test_database_delta/test.py::test_complex_table_schema
13.53s setup test_backward_compatibility/test_functions.py::test_string_functions
6.90s teardown test_database_delta/test.py::test_multiple_schemes_tables
4.15s teardown test_backward_compatibility/test_functions.py::test_string_functions
0.90s call test_backward_compatibility/test_functions.py::test_string_functions
0.68s call test_database_delta/test.py::test_embedded_database_and_tables
0.00s teardown test_database_delta/test.py::test_complex_table_schema
0.00s teardown test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_multiple_schemes_tables
=========================== short test summary info ============================
FAILED test_database_delta/test.py::test_complex_table_schema - helpers.clien...
FAILED test_database_delta/test.py::test_embedded_database_and_tables - helpe...
FAILED test_database_delta/test.py::test_multiple_schemes_tables - helpers.cl...
SKIPPED [1] test_backward_compatibility/test_functions.py:164: The test is slow in builds with sanitizer
=================== 3 failed, 1 skipped in 64.50s (0:01:04) ====================
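The run also produced parallel0_1.jsonl via --report-log. A short sketch for pulling the failed node IDs out of that file; the field names ($report_type, when, outcome, nodeid) follow the pytest-reportlog JSON layout and should be treated as an assumption if plugin versions differ:

import json

def failed_tests(report_path="parallel0_1.jsonl"):
    """Collect node IDs whose 'call' phase failed, from a pytest-reportlog file."""
    failed = []
    with open(report_path) as fh:
        for line in fh:
            rec = json.loads(line)
            # one TestReport line per setup/call/teardown phase; keep call failures
            if (
                rec.get("$report_type") == "TestReport"
                and rec.get("when") == "call"
                and rec.get("outcome") == "failed"
            ):
                failed.append(rec["nodeid"])
    return failed

On this run it would return the three test_database_delta node IDs listed in the short summary above.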
Traceback (most recent call last):
  File "/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration/./runner", line 492, in <module>
    subprocess.check_call(cmd, shell=True, bufsize=0)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_tu7fxk --privileged --dns-search='.' --memory=30709030912 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_backward_compatibility/test_functions.py::test_string_functions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:ad96270260ff ' returned non-zero exit status 1.
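The final traceback is exit-status plumbing rather than a second failure: runner line 492 hands the assembled docker run command to subprocess.check_call, so pytest's non-zero exit inside the container surfaces as CalledProcessError on the host. A minimal sketch of that propagation pattern, assuming nothing beyond what the traceback itself shows:

import subprocess

def run_in_container(docker_cmd):
    """Mirror of the runner's pattern: shell out, let a bad exit raise."""
    try:
        subprocess.check_call(docker_cmd, shell=True, bufsize=0)
    except subprocess.CalledProcessError as err:
        # exit status 1 is pytest's "some tests failed"; propagate it to CI
        raise SystemExit(err.returncode)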